Iterative Least Squares Functional Networks Classifier
Authors
Abstract
Similar articles
Iterative Reweighted Least Squares
Describes a powerful optimization algorithm that iteratively solves a weighted least-squares approximation problem in order to solve an L_p approximation problem. Methods of approximating one function by another, or of approximating measured data by the output of a mathematical or computer model, are extraordinarily useful and ubiquitous. In this note, we present a very powerful ...
Distributed Least-Squares Iterative Methods in Networks: A Survey
Many science and engineering applications involve solving a linear least-squares system formed from field measurements. In distributed cyber-physical systems (CPS), each sensor node used for measurement often knows only a partial, independent set of rows of the least-squares system. To compute the least-squares solution, the nodes need to gather all these measurements at a centralized location and then...
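The centralized gathering step can be avoided because the normal equations decompose over row blocks: each node contributes only its local Gram matrix A_iᵀA_i and vector A_iᵀb_i, and these small contributions are summed (in practice via an all-reduce or gossip step). A minimal sketch of this decomposition, with an invented three-node row partition:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((12, 3))   # full system: 12 measurement rows, 3 unknowns
b = rng.standard_normal(12)

# Each "node" holds a disjoint block of rows (hypothetical 3-node split).
blocks = [(A[i:i + 4], b[i:i + 4]) for i in range(0, 12, 4)]

# Local contributions; in a real network these sums would be an all-reduce,
# exchanging only 3x3 matrices and length-3 vectors instead of raw rows.
G = sum(Ai.T @ Ai for Ai, bi in blocks)   # sum of local Gram matrices
h = sum(Ai.T @ bi for Ai, bi in blocks)   # sum of local A_i^T b_i

x_dist = np.linalg.solve(G, h)                    # normal-equation solution
x_central = np.linalg.lstsq(A, b, rcond=None)[0]  # centralized reference
```

The two solutions coincide (up to floating-point error), which is why only the aggregated Gram contributions, not the raw measurements, need to travel through the network.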
Unifying Least Squares, Total Least Squares and Data Least Squares
The standard approaches to solving overdetermined linear systems Ax ≈ b construct minimal corrections to the vector b and/or the matrix A such that the corrected system is compatible. In ordinary least squares (LS) the correction is restricted to b, while in data least squares (DLS) it is restricted to A. In scaled total least squares (Scaled TLS) [15], corrections to both b and A are allowed, ...
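The contrast between correcting b only (LS) and correcting both A and b (TLS) can be made concrete with the standard SVD construction for TLS, in which the solution is read off the right singular vector of the augmented matrix [A | b] belonging to its smallest singular value. This is a generic textbook sketch, not code from the paper:

```python
import numpy as np

def tls(A, b):
    """Total least squares for A x ~= b via the SVD of [A | b]:
    take the right singular vector v of the smallest singular value
    and return x = -v[:n] / v[n] (standard construction)."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]          # singular values are sorted, so Vt[-1] is the smallest
    return -v[:n] / v[n]

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
b = A @ x_true                               # consistent system: LS and TLS agree
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]  # correction restricted to b
x_tls = tls(A, b)                            # corrections allowed in A and b
```

On a consistent system both estimators recover the same x; they diverge once noise enters A, which is exactly the regime TLS is designed for.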
Matrix-pattern-oriented least squares support vector classifier with AdaBoost
The Matrix-pattern-oriented Least Squares Support Vector Classifier (MatLSSVC) can directly classify matrix patterns and achieves better classification performance than its vector version, the Least Squares Support Vector Classifier (LSSVC), especially on images. However, the classification performance of MatLSSVC is matrixization-dependent, i.e. it relies heavily on the reshaping ways...
Preconditioned Iterative Methods for Solving Linear Least Squares Problems
New preconditioning strategies are described for solving large, sparse, overdetermined m × n linear least-squares problems with the CGLS method. First, direct preconditioning of the normal equations by the Balanced Incomplete Factorization (BIF) for symmetric positive definite matrices is studied and a new breakdown-free strategy is proposed. Preconditioning based on the incomplete LU fa...
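For context, the baseline solver these preconditioners accelerate is CGLS: conjugate gradients applied to the normal equations AᵀA x = Aᵀb without ever forming AᵀA. The sketch below shows the unpreconditioned iteration only; the BIF/ILU preconditioning the abstract describes is omitted:

```python
import numpy as np

def cgls(A, b, n_iter=50, tol=1e-12):
    """Unpreconditioned CGLS for min ||A x - b||_2. Works with A and A^T
    as operators, so A^T A is never formed explicitly."""
    x = np.zeros(A.shape[1])
    r = b - A @ x          # residual in data space
    s = A.T @ r            # residual of the normal equations
    p = s.copy()
    gamma = s @ s
    for _ in range(n_iter):
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        if gamma_new < tol:
            break
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))   # small overdetermined test problem
b = rng.standard_normal(20)
x = cgls(A, b)                     # matches the least-squares solution
```

In exact arithmetic CGLS converges in at most n steps; a preconditioner of the kind the paper studies reduces the iteration count on ill-conditioned sparse problems.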
Journal
Journal title: IEEE Transactions on Neural Networks
Year: 2007
ISSN: 1045-9227,1941-0093
DOI: 10.1109/tnn.2007.891632